Lava, Lava-DNF and the Loihi Software Ecosystem

Software Ecosystem and the relationship between Magma and Lava

Lava and its underlying Magma layer follow a composable coding paradigm.

  • Lava's architecture "is inspired from the Communicating Sequential Process (CSP) paradigm for asynchronous, parallel systems that interact via message passing." -- Quote from the Lava README.md

Since Lava is "composable", in theory these older Loihi coding paradigms should be interoperable with it:

  • Intel NxSDK, also known as the Loihi Neurocore API.
  • SNIPs: C code that runs on Loihi's embedded x86 cores.
  • "The specific components of Magma needed to compile processes specifically to Intel Loihi chips remains proprietary to Intel and is not provided through this GitHub site (see below). Similar Magma-layer code for other future commercial neuromorphic platforms likely will also remain proprietary." -- Quote from the Lava README.md

  • Warning: this notebook requires Python 3.8 or later.

Loihi

Lava Dynamic Neural Fields (Lava-DNF)

  • Lava-DNF is the part of the Lava ecosystem that provides many useful patterns for specifying biological connectivity.
  • https://github.com/lava-nc/lava-dnf

Usefulness of Lava-DNF for large-scale biological modelling work

What we want from an interface

  • [x] Means to specify forward connectivity between populations.
  • [x] Means to specify recurrent connectivity between populations.
  • [x] Ability to define LIF cell populations.
  • [x] Inhibitory synapses (negative weight values are possible).
  • [x] Capacity to support high cell counts.
  • [ ] Ability to visualize the whole architecture (nothing like TorchViz for ANN architectures yet).
  • [ ] Spike-Timing-Dependent Plasticity (STDP), or on-chip local learning rules (coming).
  • [ ] Delay learning (probably not even planned).
  • [ ] Adaptive neurons (supported by SLAYER, which is allegedly composable with Lava).
  • [ ] Performance profiling, including power consumption (coming).
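
Before turning to Lava's API, several of the checked boxes above (LIF populations, forward and recurrent connectivity, inhibitory synapses via negative weights) can be illustrated with a library-free numpy sketch. Every number here (threshold, leak factor, weight scales) is an illustrative assumption, not Lava code:

```python
import numpy as np

rng = np.random.default_rng(0)

n_exc, n_inh, steps = 80, 20, 200
n = n_exc + n_inh
v = np.zeros(n)                 # membrane potentials
v_th, tau = 1.0, 0.9            # spike threshold and leak factor

# Recurrent weight matrix: rows are sources, columns are targets.
# Rows belonging to inhibitory cells carry negative weights,
# i.e. inhibitory synapses.
w = rng.uniform(0.0, 0.05, size=(n, n))
w[n_exc:, :] *= -1.0

spike_counts = np.zeros(n)
spikes = np.zeros(n)
for _ in range(steps):
    drive = rng.uniform(0.0, 0.25, size=n)   # external input
    v = tau * v + drive + spikes @ w         # leak + input + recurrence
    spikes = (v >= v_th).astype(float)       # threshold crossing
    v[spikes > 0] = 0.0                      # reset after spiking
    spike_counts += spikes
```

The same structure (populations, a weight matrix with signed entries, a run loop) is what the Lava code below expresses through `LIF`, `connect`, and `Weights`.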

To show how I established the tick boxes above, a Lava-DNF code demonstration follows.

To evaluate whether Lava is useful for building bioplausible models, we want to define a variety of overlapping Spiking Neural Network (SNN) architectures. Each one can be thought of as a weighted directed graph:

$G = (V, E)$
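
As a library-free sketch (the population names and weights are illustrative, not taken from the model below), such a graph can be stored as a weighted adjacency matrix `W`, where `W[i, j]` is the weight of the edge from population `i` to population `j`, positive for excitatory and negative for inhibitory projections:

```python
import numpy as np

# Vertices V: four populations (illustrative names).
populations = ["L2/3e", "L2/3i", "L4e", "L4i"]
n = len(populations)

# Edges E with weights: (source, target, weight).
# Negative weights mark inhibitory projections.
edges = [
    ("L4e", "L2/3e", 0.8),    # feed-forward excitation
    ("L2/3e", "L2/3e", 0.3),  # recurrent excitation
    ("L2/3i", "L2/3e", -0.6), # lateral inhibition
]

# Build the weighted adjacency matrix of G = (V, E).
idx = {name: k for k, name in enumerate(populations)}
W = np.zeros((n, n))
for src, dst, weight in edges:
    W[idx[src], idx[dst]] = weight
```

Composing several such graphs, as the cortical model below does, amounts to stacking these matrices into larger block matrices.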

Below is a diagram of the Potjans-Diesmann cortical microcircuit model. This model can be thought of as a composition of many weighted directed graphs, so we will use Lava, a supported interface, to begin building a cortical model with the Python Loihi simulator.

Import relevant modules.

In [1]:
from lava.proc.lif.process import LIF
from lava.proc.monitor.process import Monitor
from lava.proc.monitor.models import PyMonitorModel
from lava.magma.core.run_configs import Loihi1SimCfg  # Loihi simulator, not Loihi itself.
from lava.magma.core.run_conditions import RunSteps
from lava.lib.dnf.connect.connect import connect
from lava.lib.dnf.operations.operations import Weights
from lava.lib.dnf.inputs.rate_code_spike_gen.process import RateCodeSpikeGen
from lava.lib.dnf.inputs.gauss_pattern.process import GaussPattern
from lava.lib.dnf.kernels.kernels import MultiPeakKernel
from lava.lib.dnf.utils.plotting import raster_plot

import numpy as np
import elephant

def compute_cv(spikes_population):
    '''
    Compute the coefficient of variation (CV) over a whole population.
    The CV is conventionally taken over inter-spike intervals (ISIs),
    so the ISIs of every cell are pooled before the CV (std/mean)
    is computed. Assumes each column of `spikes_population` holds
    one cell's spike times.
    '''
    pooled_isis = []
    for spike_train in spikes_population.T:
        pooled_isis.extend(elephant.statistics.isi(spike_train))
    return elephant.statistics.cv(pooled_isis, axis=0, nan_policy='propagate')
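
As a quick, elephant-independent sanity check of what a CV should report: for a Poisson process the inter-spike intervals are exponentially distributed, so their coefficient of variation (std/mean) is close to 1. A pure-numpy version:

```python
import numpy as np

rng = np.random.default_rng(42)

# Exponential inter-spike intervals, as produced by a Poisson process.
isis = rng.exponential(scale=10.0, size=100_000)

# Coefficient of variation: standard deviation over mean.
cv = isis.std() / isis.mean()
print(cv)  # close to 1.0 for a Poisson process
```

CV values well below 1 indicate regular (clock-like) firing; values near or above 1 indicate irregular, Poisson-like firing, which is what cortical models typically aim for.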

Cortical Specification:

These numbers are nominal.

  • I chose the following numbers to fit the performance of a resource-limited laptop:
  • 2 columns.
  • 4 layers.
  • 1 excitatory and 1 inhibitory population per layer.
  • 85 cells per population, i.e. 170 cells per layer.
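
For scale, the nominal numbers above imply the following total cell count:

```python
# Total cell count implied by the nominal specification above.
ncolumns, nlayers = 2, 4
cells_per_population, populations_per_layer = 85, 2
cells_per_layer = cells_per_population * populations_per_layer  # 170
total_cells = ncolumns * nlayers * cells_per_layer
print(total_cells)  # 1360
```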
In [2]:
ncolumns = 2

Create layerwise cell populations

In [3]:
# Ex: excitatory populations, one object-array entry per column.
ly_2_3_ex = np.empty(ncolumns, dtype=object)
ly_4_ex = np.empty(ncolumns, dtype=object)
ly_5_ex = np.empty(ncolumns, dtype=object)
ly_6_ex = np.empty(ncolumns, dtype=object)

# In: inhibitory populations.
ly_2_3_in = np.empty(ncolumns, dtype=object)
ly_4_in = np.empty(ncolumns, dtype=object)
ly_5_in = np.empty(ncolumns, dtype=object)
ly_6_in = np.empty(ncolumns, dtype=object)

ncells = 85
for i in range(ncolumns):
    ly_2_3_ex[i] = LIF(shape=(ncells,))
    ly_4_ex[i] = LIF(shape=(ncells,))
    ly_5_ex[i] = LIF(shape=(ncells,))
    ly_6_ex[i] = LIF(shape=(ncells,))

    ly_2_3_in[i] = LIF(shape=(ncells,))
    ly_4_in[i] = LIF(shape=(ncells,))
    ly_5_in[i] = LIF(shape=(ncells,))
    ly_6_in[i] = LIF(shape=(ncells,))

    

Create the connectivity pattern

  • Repeated and stereotyped connections within and between layers:
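
As a library-free sketch of what "repeated and stereotyped" means here (all weight values are illustrative assumptions, not the model's actual parameters): the same within-layer template, with excitatory source rows positive and inhibitory source rows negative, can be stamped out for every layer and combined with between-layer blocks using `np.block`:

```python
import numpy as np

n_e, n_i = 85, 85          # cells per excitatory / inhibitory population
n = n_e + n_i              # cells per layer (170)

def within_layer(w_ee, w_ei, w_ie, w_ii):
    """Stereotyped within-layer template: rows are sources, columns
    are targets. Inhibitory source rows are negated (Dale's law)."""
    return np.block([
        [np.full((n_e, n_e), w_ee), np.full((n_e, n_i), w_ei)],
        [np.full((n_i, n_e), -w_ie), np.full((n_i, n_i), -w_ii)],
    ])

# The same template is repeated for each layer (illustrative weights).
w_l23 = within_layer(0.05, 0.04, 0.08, 0.06)
w_l4 = within_layer(0.05, 0.04, 0.08, 0.06)

# Between-layer feed-forward block: L4 -> L2/3,
# where only excitatory sources project.
w_l4_to_l23 = np.zeros((n, n))
w_l4_to_l23[:n_e, :] = 0.02

# Full two-layer matrix, layers ordered [L2/3, L4].
w_full = np.block([
    [w_l23, np.zeros((n, n))],   # L2/3 recurrence; no L2/3 -> L4 here
    [w_l4_to_l23, w_l4],         # L4 -> L2/3 feed-forward; L4 recurrence
])
```

In the Lava code this same pattern is expressed by looping over layers and calling `connect` with `Weights` operations, rather than by assembling one dense matrix.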